Supervised Feature Selection With Orthogonal Regression and Feature Weighting

Authors

Abstract

Effective features can improve the performance of a model and help us understand the characteristics and underlying structure of complex data. Previously proposed feature selection methods usually cannot retain enough discriminative information. To address this shortcoming, we propose a novel supervised orthogonal least squares regression model with feature weighting for feature selection. The optimization problem of the objective function can be solved by employing generalized power iteration (GPI) and augmented Lagrangian multiplier (ALM) methods. Experimental results show that the proposed method can effectively reduce feature dimensionality and obtain better classification results than traditional feature selection methods, and the convergence of our iterative algorithm is also proved. Consequently, the effectiveness and superiority of the proposed method are verified both theoretically and experimentally.
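The abstract only outlines the model, but a minimal sketch of the pipeline it describes, orthogonal least squares regression solved with generalized power iteration (GPI) and features then ranked from the learned projection, might look as follows. The explicit feature-weighting variable and its ALM update are omitted here, and the function names, initialization, and row-norm ranking are illustrative assumptions rather than the authors' implementation.

```python
# Sketch: orthogonal regression via GPI, then feature ranking by row norms of W.
# Simplified relative to the paper: no learned feature-weight vector, no ALM step.
import numpy as np

def orthogonal_regression_gpi(X, Y, n_iter=100, tol=1e-8):
    """Approximately solve  min_{W: W^T W = I} ||X W - Y||_F^2  with GPI.

    X : (n_samples, n_features) data matrix, assumed centered.
    Y : (n_samples, n_classes) one-hot label matrix, assumed centered.
    Returns W of shape (n_features, n_classes) with orthonormal columns.
    """
    d, c = X.shape[1], Y.shape[1]
    A = X.T @ X                      # quadratic term of the objective
    B = X.T @ Y                      # linear term of the objective
    # Any alpha >= largest eigenvalue of A keeps (alpha*I - A) PSD, as GPI requires.
    alpha = np.linalg.eigvalsh(A)[-1] + 1e-6

    rng = np.random.default_rng(0)
    W, _ = np.linalg.qr(rng.standard_normal((d, c)))   # random orthonormal start
    prev_obj = np.inf
    for _ in range(n_iter):
        M = (alpha * np.eye(d) - A) @ W + B
        # W maximizing Tr(W^T M) under W^T W = I is U V^T from the thin SVD of M.
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W = U @ Vt
        obj = np.linalg.norm(X @ W - Y) ** 2
        if abs(prev_obj - obj) < tol:
            break
        prev_obj = obj
    return W

def select_features(X, Y, k):
    """Rank features by the row norms of W and return the top-k feature indices."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    W = orthogonal_regression_gpi(Xc, Yc)
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]
```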

Related papers

Orthogonal Principal Feature Selection

This paper presents a feature selection method based on the popular transformation approach: principal component analysis (PCA). It is popular because it finds the optimal solution to several objective functions (including maximum variance and minimum sum-squared-error), and also because it provides an orthogonal basis solution. However, PCA as a dimensionality reduction algorithm does not explic...
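For context, the two PCA properties this snippet appeals to, an orthonormal basis and minimum sum-squared reconstruction error among rank-k projections, can be checked in a few lines. This is only an illustration of plain PCA on toy data, not the orthogonal principal feature selection procedure of the cited paper.

```python
# Illustration: PCA directions are orthonormal and give the best rank-k reconstruction.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 10))  # correlated toy data
Xc = X - X.mean(axis=0)                        # PCA assumes centered data

# Principal directions are the right singular vectors of the centered data.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
print(np.allclose(Vt @ Vt.T, np.eye(10)))      # orthonormal basis -> True

k = 3
X_pca = Xc @ Vt[:k].T @ Vt[:k]                 # rank-k reconstruction from top-k components
err_pca = np.sum((Xc - X_pca) ** 2)
# By the Eckart-Young theorem no other rank-k orthonormal projection does better;
# compare against a random rank-k projection as a sanity check.
Q, _ = np.linalg.qr(rng.standard_normal((10, k)))
err_rand = np.sum((Xc - Xc @ Q @ Q.T) ** 2)
print(err_pca <= err_rand)                     # True
```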

Supervised Infinite Feature Selection

In this paper, we present a new feature selection method that is suitable for both unsupervised and supervised problems. We build upon the recently proposed Infinite Feature Selection (IFS) method where feature subsets of all sizes (including infinity) are considered. We extend IFS in two ways. First, we propose a supervised version of it. Second, we propose new ways of forming the feature adja...

Discrete feature weighting & selection algorithm

A new method of feature weighting, also useful for feature selection, is described. It is quite efficient and gives accurate results. In general, the weighting algorithm may be used with any kind of learning algorithm. The weighting algorithm with a k-nearest neighbors model was used to estimate the optimal feature base for a given distance measure. Results obtained with this algorithm clea...
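As an illustration of the idea in this snippet, judging feature weights through a k-nearest-neighbors model under a weighted distance, a small sketch might look like the following. The leave-one-out criterion, the random-search strategy, and k=5 are assumptions made for the example, not the algorithm of the cited paper.

```python
# Sketch: evaluate candidate feature-weight vectors by leave-one-out kNN accuracy
# under a feature-weighted Euclidean metric, and keep the best one found.
import numpy as np

def loo_knn_accuracy(X, y, weights, k=5):
    """Leave-one-out kNN accuracy with a feature-weighted Euclidean distance.

    y must contain integer class labels 0..C-1; weights must be non-negative.
    """
    Xw = X * np.sqrt(weights)                    # scaling features = weighting squared distances
    diff = Xw[:, None, :] - Xw[None, :, :]
    dist = np.einsum('ijk,ijk->ij', diff, diff)  # pairwise squared distances
    np.fill_diagonal(dist, np.inf)               # exclude each point from its own neighborhood
    correct = 0
    for i in range(len(y)):
        neighbours = np.argsort(dist[i])[:k]
        votes = np.bincount(y[neighbours])
        correct += votes.argmax() == y[i]
    return correct / len(y)

def random_search_weights(X, y, n_trials=200, seed=0):
    """Return the weight vector with the best leave-one-out score among random candidates."""
    rng = np.random.default_rng(seed)
    best_w = np.ones(X.shape[1])
    best_acc = loo_knn_accuracy(X, y, best_w)
    for _ in range(n_trials):
        w = rng.random(X.shape[1])
        acc = loo_knn_accuracy(X, y, w)
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc
```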

Graph Laplacian for Semi-supervised Feature Selection in Regression Problems

Feature selection is fundamental in many data mining or machine learning applications. Most of the algorithms proposed for this task make the assumption that the data are either supervised or unsupervised, while in practice supervised and unsupervised samples are often simultaneously available. Semi-supervised feature selection is thus needed, and has been studied quite intensively these past f...

Semi-supervised Feature Selection via Rescaled Linear Regression

With the rapid increase of complex and high-dimensional sparse data, demands for new methods that select features by exploiting both labeled and unlabeled data have increased. Least squares regression based feature selection methods usually learn a projection matrix and evaluate the importance of features using the projection matrix, which lacks theoretical explanation. Moreover, these methods canno...

Journal

Journal title: IEEE Transactions on Neural Networks and Learning Systems

Year: 2021

ISSN: 2162-237X, 2162-2388

DOI: https://doi.org/10.1109/tnnls.2020.2991336